Asymptotic normality of the integrated square error of a density estimator in the convolution model

Author

  • C. Butucea
Abstract

In this paper we consider a kernel estimator of a density in a convolution model and give a central limit theorem for its integrated square error (ISE). The kernel estimator is rather classical in minimax theory when the underlying density is recovered from noisy observations. The kernel is fixed and depends heavily on the distribution of the noise, which is assumed to be entirely known. The bandwidth is not fixed; the results hold for any sequence of bandwidths decreasing to 0. In particular, the central limit theorem holds for the bandwidth minimizing the mean integrated square error (MISE). Rates of convergence differ markedly between the case of regular noise and that of super-regular noise. The smoothness of the underlying unknown density is relevant for the evaluation of the MISE. MSC: 62G05, 62G20
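The estimator the abstract refers to is the standard deconvolution kernel estimator: divide the empirical characteristic function of the noisy observations by the (known) characteristic function of the noise, cut off the inversion integral at frequency 1/h, and invert. A minimal Python sketch under illustrative assumptions — Laplace (regular, i.e. polynomially decaying) noise with known scale `b`, the sinc kernel, and all names and parameter values chosen for the example, not taken from the paper:

```python
import numpy as np

def deconvolution_kde(x_grid, Y, h, noise_cf, n_t=2001):
    """Deconvolution kernel density estimate evaluated on x_grid.

    Y        : noisy observations Y_j = X_j + eps_j
    h        : bandwidth (the theory allows any sequence h -> 0)
    noise_cf : characteristic function of the noise, t -> phi_eps(t)

    Uses the sinc kernel, whose Fourier transform is the indicator of
    [-1, 1], so the inversion integral is cut off at |t| <= 1/h.
    """
    t = np.linspace(-1.0 / h, 1.0 / h, n_t)
    dt = t[1] - t[0]
    # empirical characteristic function of the observations
    phi_emp = np.exp(1j * np.outer(t, Y)).mean(axis=1)
    # divide out the known noise characteristic function
    ratio = phi_emp / noise_cf(t)
    # Fourier inversion on the x-grid (Riemann sum over t)
    integrand = np.exp(-1j * np.outer(x_grid, t)) * ratio
    return integrand.real.sum(axis=1) * dt / (2.0 * np.pi)

# Illustrative use: recover a N(0, 1) density observed with Laplace noise.
rng = np.random.default_rng(0)
n, b, h = 2000, 0.3, 0.4
X = rng.normal(0.0, 1.0, n)               # unobserved sample
Y = X + rng.laplace(0.0, b, n)            # noisy observations
x_grid = np.linspace(-5.0, 5.0, 201)
f_hat = deconvolution_kde(x_grid, Y, h,
                          lambda t: 1.0 / (1.0 + b**2 * t**2))
```

Because the Laplace characteristic function decays only polynomially (the "regular noise" case), the division stays numerically stable on |t| ≤ 1/h; for super-regular (e.g. Gaussian) noise the denominator decays exponentially and the bandwidth controls the blow-up much more delicately, which is exactly why the two cases yield different rates.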


Similar resources

Asymptotic Behaviors of Nearest Neighbor Kernel Density Estimator in Left-truncated Data

Kernel density estimators are the basic tools for density estimation in non-parametric statistics. The k-nearest neighbor kernel estimators represent a special form of kernel density estimators, in which the bandwidth is varied depending on the location of the sample points. In this paper, we initially introduce the k-nearest neighbor kernel density estimator in the random left-truncatio...


On the estimation of the marginal density of a moving average process

The authors present a new convolution-type kernel estimator of the marginal density of an MA(1) process with general error distribution. They prove the √n-consistency of the nonparametric estimator and give asymptotic expressions for the mean square and the integrated mean square error of some unobservable version of the estimator. An extension to MA(q) processes is presented in the case of th...


Some Asymptotic Results of Kernel Density Estimator in Length-Biased Sampling

In this paper, we prove the strong uniform consistency and asymptotic normality of the kernel density estimator proposed by Jones [12] for length-biased data. The approach is based on the invariance principle for the empirical processes proved by Horváth [10]. Simulations are presented for different cases to demonstrate both consistency and asymptotic normality, and the method is illustrated by ...


Optimal convergence rates, Bahadur representation, and asymptotic normality of partitioning estimators

This paper studies the asymptotic properties of partitioning estimators of the conditional expectation function and its derivatives. Mean-square and uniform convergence rates are established and shown to be optimal under simple and intuitive conditions. The uniform rate explicitly accounts for the effect of moment assumptions, which is useful in semiparametric inference. A general asymptotic in...


A Goodness-of-fit test for GARCH innovation density

We prove asymptotic normality of a suitably standardized integrated square difference between a kernel-type error density estimator based on residuals and the expected value of the error density estimator based on innovations in GARCH models. This result is similar to that of Bickel-Rosenblatt under the i.i.d. setup. Consequently, the goodness-of-fit test for the innovation density of the GARCH pro...



Publication date: 2004